Strong Local Convergence Properties of Adaptive Regularized Methods for Nonlinear Least-squares
Abstract
This paper studies adaptive regularized methods for nonlinear least-squares problems in which the model of the objective function used at each iteration is either the Euclidean residual regularized by a quadratic term or the Gauss-Newton model regularized by a cubic term. For suitable choices of the regularization parameter, the role of the regularization term is to provide global convergence. Here we investigate the impact of the regularization term on the local convergence rate of the methods and establish that, under the well-known error bound condition, quadratic convergence to zero-residual solutions is enforced. This result extends the existing analysis of the local convergence properties of adaptive regularized methods. In fact, the known results were derived under the standard full rank condition on the Jacobian at a zero-residual solution, whereas the error bound condition is weaker than the full rank condition and allows the solution set to be locally nonunique.
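For concreteness, the two models referred to above can be written as follows (the notation below is ours, a sketch of the standard formulations rather than a quotation from the paper). With residual F and Jacobian J at the iterate x_k, and regularization parameter \sigma_k > 0,

\[
m_k(p) = \|F(x_k) + J(x_k)\,p\| + \frac{\sigma_k}{2}\|p\|^2
\qquad \text{(regularized Euclidean residual model)},
\]
\[
m_k(p) = \tfrac{1}{2}\|F(x_k) + J(x_k)\,p\|^2 + \frac{\sigma_k}{3}\|p\|^3
\qquad \text{(cubically regularized Gauss-Newton model)}.
\]

The error bound condition mentioned in the abstract is the requirement that, near the set X^* of zero-residual solutions, the distance to X^* is controlled by the residual norm:

\[
\operatorname{dist}(x, X^*) \le \beta\,\|F(x)\| \quad \text{for all } x \text{ in a neighborhood of } X^*,
\]

for some constant \beta > 0. Unlike full rank of the Jacobian at a solution, this condition does not force the solution set to be locally a single point.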
Similar Papers
Superlinearly convergent exact penalty projected structured Hessian updating schemes for constrained nonlinear least squares: asymptotic analysis
We present a structured algorithm for solving constrained nonlinear least squares problems, and establish its local two-step Q-superlinear convergence. The approach is based on an adaptive structured scheme due to Mahdavi-Amiri and Bartels of the exact penalty method of Coleman and Conn for nonlinearly constrained optimization problems. The structured adaptation also makes use of the ideas of N...
Harmonics Estimation in Power Systems using a Fast Hybrid Algorithm
In this paper a novel hybrid algorithm for harmonics estimation in power systems is proposed. The estimation of the harmonic components is a nonlinear problem due to the nonlinearity of phase of sinusoids in distorted waveforms. Most researchers implemented nonlinear methods to extract the harmonic parameters. However, nonlinear methods for amplitude estimation increase time of convergence. Hen...
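As a rough illustration of why the phases make this problem nonlinear (our notation, not taken from the paper): a distorted waveform is commonly modeled as

\[
x(t) = \sum_{n=1}^{N} A_n \sin(n\omega_0 t + \phi_n) + \varepsilon(t),
\]

which is nonlinear in the phases \phi_n. Expanding each term as A_n\cos\phi_n\,\sin(n\omega_0 t) + A_n\sin\phi_n\,\cos(n\omega_0 t) = a_n\sin(n\omega_0 t) + b_n\cos(n\omega_0 t) makes the model linear in the coefficients (a_n, b_n), so the amplitudes A_n = \sqrt{a_n^2 + b_n^2} can be recovered by linear least squares, while estimating the phases directly requires a nonlinear solver.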
Convergence of a Regularized Euclidean Residual Algorithm for Nonlinear Least-Squares
The convergence properties of the new Regularized Euclidean Residual method for solving general nonlinear least-squares and nonlinear equations problems are investigated. This method, derived from a proposal by Nesterov (2007), uses a model of the objective function consisting of the unsquared Euclidean linearized residual regularized by a quadratic term. At variance with previous analysis, its...
Global and Local Convergence of a Levenberg-Marquardt Algorithm for Inverse Problems
The Levenberg-Marquardt algorithm is one of the most popular algorithms for the solution of nonlinear least squares problems. In this paper, we propose a novel Levenberg-Marquardt method for solving general nonlinear least squares problems and analyze its global and local convergence. The proposed algorithm enjoys strong convergence properties (global convergence as well as quadratic ...
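For reference, the classical Levenberg-Marquardt step (the standard formulation, not necessarily the specific variant proposed in that paper) minimizes the quadratically regularized Gauss-Newton model

\[
m_k(p) = \tfrac{1}{2}\|F(x_k) + J(x_k)\,p\|^2 + \frac{\lambda_k}{2}\|p\|^2,
\]

or, equivalently, solves the linear system \((J(x_k)^{\mathsf T} J(x_k) + \lambda_k I)\,p_k = -J(x_k)^{\mathsf T} F(x_k)\). In the local analyses of this type, choosing \lambda_k proportional to a power of \|F(x_k)\| is what typically yields quadratic convergence to zero-residual solutions under an error bound condition, without requiring the Jacobian to have full rank.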
متن کاملAsymptotic Properties of Nonlinear Least Squares Estimates in Stochastic Regression Models Over a Finite Design Space. Application to Self-Tuning Optimisation
We present new conditions for the strong consistency and asymptotic normality of the least squares estimator in nonlinear stochastic models when the design variables vary in a finite set. The application to self-tuning optimisation is considered, with a simple adaptive strategy that guarantees simultaneously the convergence to the optimum and the strong consistency of the estimates of the model...